## Sum Of Squares
A statistical technique used in regression analysis. The sum of squares is a mathematical approach to measuring the dispersion of data points. In a regression analysis, the goal is to determine how well a data series can be fitted to a function that might help to explain how the series was generated. The sum of squares provides a mathematical way to find the function that deviates least from the data, i.e., the function of best fit.

To determine the sum of squares, the distance between each data point and the line of best fit is squared, and then all of the squares are summed. The line of best fit is the one that minimizes this value.
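The calculation described above can be sketched in a few lines of code. This is a hypothetical illustration (the data, the `sum_of_squares` helper, and the candidate lines are invented for the example): each point's vertical distance to a candidate line is squared and the squares are summed.

```python
# Illustrative sketch: sum of squared vertical distances between each
# data point (x, y) and a candidate line y = m*x + b.

def sum_of_squares(xs, ys, m, b):
    """Sum of squared residuals of the points against the line."""
    return sum((y - (m * x + b)) ** 2 for x, y in zip(xs, ys))

# Hypothetical data lying exactly on the line y = 2x.
xs = [1, 2, 3, 4]
ys = [2, 4, 6, 8]

# The perfect fit yields a sum of squares of zero...
print(sum_of_squares(xs, ys, 2.0, 0.0))  # 0.0
# ...while a worse candidate line yields a larger value.
print(sum_of_squares(xs, ys, 1.5, 0.0))  # 7.5
```

The line of best fit is precisely the choice of slope `m` and intercept `b` that makes this quantity as small as possible.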

There are two methods of regression analysis that use the sum of squares: the linear least squares method and the non-linear least squares method. "Least squares" refers to the fact that the regression function minimizes the sum of the squared deviations from the actual data points. In this way, it is possible to find a function that statistically provides the best fit for the data. A regression function can be either linear (a straight line) or non-linear (a curved line).
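For the linear case, the minimizing slope and intercept have a well-known closed form (slope = covariance of x and y divided by variance of x). The sketch below assumes this standard formula; the data values are hypothetical, chosen so the answer comes out cleanly.

```python
# Minimal sketch of the linear least squares method: compute the slope
# and intercept that minimize the sum of squared deviations for a
# straight-line fit y = slope * x + intercept.

def linear_least_squares(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: sum of cross-deviations divided by sum of squared x-deviations
    # (i.e., covariance of x and y over variance of x).
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical noisy data roughly following y = 2x.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]
m, b = linear_least_squares(xs, ys)
print(m, b)  # 1.95 0.15
```

No other straight line achieves a smaller sum of squared deviations for this data set; non-linear least squares generalizes the same idea to curved functions, typically via iterative numerical optimization rather than a closed form.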

*Investment dictionary. Academic. 2012.*
